Convexity of Proper Composite Binary Losses

Authors

  • Mark D. Reid
  • Robert C. Williamson
Abstract

A composite loss assigns a penalty to a real-valued prediction by associating the prediction with a probability via a link function and then applying a class probability estimation (CPE) loss. If the risk for a composite loss is always minimised by predicting the value associated with the true class probability, the composite loss is proper. We provide a novel, explicit and complete characterisation of the convexity of any proper composite loss in terms of its link and the “weight function” associated with its proper CPE loss.
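The setup in the abstract can be illustrated with the most familiar example: logistic loss, i.e. the log (CPE) loss composed with the logit link. The sketch below uses illustrative names of my own (not from the paper) and numerically checks properness: the conditional risk is minimised at the prediction linked to the true class probability.

```python
import math

def log_loss(y, p):
    """Proper CPE loss: penalty for predicting probability p when the label is y in {0, 1}."""
    return -math.log(p) if y == 1 else -math.log(1.0 - p)

def logit(p):
    """Link function: maps a probability to a real-valued prediction."""
    return math.log(p / (1.0 - p))

def sigmoid(v):
    """Inverse link: maps a real-valued prediction back to a probability."""
    return 1.0 / (1.0 + math.exp(-v))

def composite_loss(y, v):
    """Logistic loss: log loss applied after the inverse link."""
    return log_loss(y, sigmoid(v))

def conditional_risk(eta, v):
    """Expected composite loss when P(y = 1) = eta."""
    return eta * composite_loss(1, v) + (1.0 - eta) * composite_loss(0, v)

# Properness check: among nearby predictions, the risk is smallest
# at v = logit(eta), the value associated with the true class probability.
eta = 0.7
v_star = logit(eta)
for d in (-0.5, -0.1, 0.1, 0.5):
    assert conditional_risk(eta, v_star) < conditional_risk(eta, v_star + d)
```

Here the link pins down which real value "means" a given probability; properness is then a statement about where the conditional risk attains its minimum.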


Similar references

Composite Binary Losses

We study losses for binary classification and class probability estimation and extend the understanding of them from margin losses to general composite losses which are the composition of a proper loss with a link function. We characterise when margin losses can be proper composite losses, explicitly show how to determine a symmetric loss in full from half of one of its partial losses, introduc...


Composite Multiclass Losses

We consider loss functions for multiclass prediction problems. We show when a multiclass loss can be expressed as a “proper composite loss”, which is the composition of a proper loss and a link function. We extend existing results for binary losses to multiclass losses. We subsume results on “classification calibration” by relating it to properness. We determine the stationarity condition, Breg...


The Convexity and Design of Composite Multiclass Losses

We consider composite loss functions for multiclass prediction comprising a proper (i.e., Fisher-consistent) loss over probability distributions and an inverse link function. We establish conditions for their (strong) convexity and explore the implications. We also show how the separation of concerns afforded by using this composite representation allows for the design of families of losses with...
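As a concrete instance of the kind of convexity being characterised, the sketch below (illustrative code of my own, not from the paper) numerically checks that the logistic composite loss is convex in its real-valued argument, by verifying that finite-difference second derivatives stay non-negative.

```python
import math

def sigmoid(v):
    """Inverse link: maps a real-valued prediction back to a probability."""
    return 1.0 / (1.0 + math.exp(-v))

def logistic_loss(y, v):
    """Log loss composed with the sigmoid inverse link; y in {0, 1}."""
    p = sigmoid(v)
    return -math.log(p) if y == 1 else -math.log(1.0 - p)

def second_difference(f, v, h=1e-4):
    """Finite-difference estimate of f''(v)."""
    return (f(v + h) - 2.0 * f(v) + f(v - h)) / (h * h)

# Convexity in the prediction: second differences are non-negative
# for both labels, across a grid of real-valued predictions.
for y in (0, 1):
    for v in [x / 10.0 for x in range(-50, 51)]:
        assert second_difference(lambda u: logistic_loss(y, u), v) >= 0.0
```

For logistic loss the second derivative in the prediction is sigmoid(v) * (1 - sigmoid(v)), which is strictly positive, so the numerical check is consistent with the closed form.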


Surrogate Regret Bounds for the Area Under the ROC Curve via Strongly Proper Losses

The area under the ROC curve (AUC) is a widely used performance measure in machine learning, and has been widely studied in recent years particularly in the context of bipartite ranking. A dominant theoretical and algorithmic framework for AUC optimization/bipartite ranking has been to reduce the problem to pairwise classification; in particular, it is well known that the AUC regret can be form...


Consistency of Surrogate Risk Minimization Methods for Binary Classification using Classification Calibrated Losses

In the previous lecture, we saw that for a λ-strongly proper composite loss ψ, it is possible to bound the 0-1 regret in terms of its ψ-regret. Hence, for a λ-strongly proper composite loss ψ, if we have a ψ-consistent algorithm, we can use it to obtain a 0-1 consistent algorithm. However, not all loss functions used as surrogates in binary classification are proper, the hinge loss being one...
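The hinge-loss remark can be seen numerically: for every class probability η > 1/2 the conditional hinge risk is minimised at v = +1, so the minimiser cannot be inverted back to η and the loss is not proper. A hypothetical sketch (function names are mine, not from the lecture notes):

```python
def hinge(y, v):
    """Hinge loss for labels y in {-1, +1} and real-valued prediction v."""
    return max(0.0, 1.0 - y * v)

def conditional_risk(eta, v):
    """Expected hinge loss when P(y = +1) = eta."""
    return eta * hinge(+1, v) + (1.0 - eta) * hinge(-1, v)

def minimiser(eta, grid):
    """Grid search for the prediction minimising the conditional risk."""
    return min(grid, key=lambda v: conditional_risk(eta, v))

grid = [i / 100.0 for i in range(-300, 301)]

# Different class probabilities above 1/2 share the same minimiser v = +1,
# so the optimal prediction reveals nothing about eta beyond its sign:
# the hinge loss is not a proper composite loss.
print(minimiser(0.6, grid), minimiser(0.9, grid))  # both 1.0
```

This is exactly why classification calibration, rather than properness, is the right consistency notion for such surrogates.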



Journal:

Volume   Issue

Pages   -

Publication date: 2010